Higher Order Derivatives in Costa's Entropy Power Inequality

Authors

  • Fan Cheng
  • Yanlin Geng
Abstract

Let X be an arbitrary continuous random variable and let Z be an independent Gaussian random variable with zero mean and unit variance. For t > 0, Costa proved that the entropy power e^{2h(X + √t Z)} is concave in t, where the proof hinged on the first and second order derivatives of h(X + √t Z). Specifically, these two derivatives are signed: ∂/∂t h(X + √t Z) ≥ 0 and ∂²/∂t² h(X + √t Z) ≤ 0. In this paper, we show that the third order derivative of h(X + √t Z) is nonnegative, which implies that the Fisher information J(X + √t Z) is convex in t. We further show that the fourth order derivative of h(X + √t Z) is nonpositive. Following these results, we make two conjectures on h(X + √t Z): the first is that ∂ⁿ/∂tⁿ h(X + √t Z) is nonnegative in t if n is odd, and nonpositive otherwise; the second is that log J(X + √t Z) is convex in t. The concavity of h(√t X + √(1−t) Z) is also studied, revealing its connection with Costa's EPI.
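As a quick sanity check (not part of the paper's argument), the claimed signs can be verified symbolically in the special case where X is itself Gaussian, say X ~ N(0, s), since then h(X + √t Z) = ½ log(2πe(s + t)) is available in closed form. A minimal sympy sketch:

import sympy as sp

# Gaussian special case: X ~ N(0, s), Z ~ N(0, 1) independent,
# so X + sqrt(t)*Z ~ N(0, s + t) and h(X + sqrt(t)*Z) has a closed form.
t, s = sp.symbols('t s', positive=True)
h = sp.Rational(1, 2) * sp.log(2 * sp.pi * sp.E * (s + t))

# First four t-derivatives: 1/(2(s+t)), -1/(2(s+t)^2), 1/(s+t)^3, -3/(s+t)^4,
# alternating in sign exactly as the paper proves (n = 1..4) and conjectures (all n).
for n in range(1, 5):
    print(n, sp.simplify(sp.diff(h, t, n)))

# Fisher information in this case is J(X + sqrt(t)*Z) = 1/(s + t), so
# (log J)'' = 1/(s + t)^2 >= 0, consistent with the second conjecture.
J = 1 / (s + t)
print(sp.simplify(sp.diff(sp.log(J), t, 2)))

The Gaussian case is of course the easy one; the paper's contribution is establishing the third- and fourth-order signs for arbitrary continuous X.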

Similar articles

A multivariate generalization of Costa's entropy power inequality

A simple multivariate version of Costa’s entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a multidimensional concave function of the individual variances of the components of the signal. As a side result, we also give an expression for the He...
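As a hedged illustration of this claim in the simplest setting (a Gaussian signal with independent components of variances a₁, …, aₙ and unit-variance white noise; notation mine, not the paper's), the entropy power reduces to a geometric mean:

\[
N(X + Z) = \frac{1}{2\pi e}\, e^{2h(X+Z)/n} = \Big(\prod_{i=1}^{n} (a_i + 1)\Big)^{1/n},
\]

which is indeed jointly concave in (a₁, …, aₙ), matching the multivariate concavity asserted above.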

A Counterexample to the Vector Generalization of Costa's EPI, and Partial Resolution

We give a counterexample to the vector generalization of Costa’s entropy power inequality (EPI) due to Liu, Liu, Poor and Shamai. In particular, the claimed inequality can fail if the matrix-valued parameter in the convex combination does not commute with the covariance of the additive Gaussian noise. Conversely, the inequality holds if these two matrices commute. For a random vector X with dens...

The Entropy Power Inequality and Mrs. Gerber's Lemma for Abelian Groups of Order 2^n

Shannon’s Entropy Power Inequality can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The entropy power inequality has played a key role in resolving a number of problems in information theory. It is therefore interesting to examine the existence of a similar inequality for discrete random...
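For reference, the classical inequality being discussed is Shannon's EPI: for independent random vectors X and Y in ℝⁿ with densities,

\[
e^{2h(X+Y)/n} \ge e^{2h(X)/n} + e^{2h(Y)/n},
\]

with equality when X and Y are Gaussian with proportional covariances; the question raised in this abstract is whether a counterpart holds when differential entropy is replaced by the entropy of group-valued random variables.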

On the entropy power inequality for the Rényi entropy of order [0, 1]

Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors are able to derive a Rényi entropy power inequality for log-concave random vectors when Rényi parameters belong to (0, 1). Furthermore, the estimates are shown to be somewhat sharp.
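For context (this definition is standard, not specific to the paper), the Rényi entropy of order α ∈ (0, 1) ∪ (1, ∞) of a random vector X with density f is

\[
h_\alpha(X) = \frac{1}{1-\alpha} \log \int f(x)^{\alpha} \, dx,
\]

which recovers the differential entropy h(X) as α → 1; the result above extends the EPI from this limiting case to the range α ∈ (0, 1) for log-concave vectors.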

The Entropy Power Inequality and Mrs. Gerber's Lemma for groups of order 2^n

Shannon’s Entropy Power Inequality (EPI) can be viewed as characterizing the minimum differential entropy achievable by the sum of two independent random variables with fixed differential entropies. The EPI is a powerful tool and has been used to resolve a number of problems in information theory. In this paper we examine the existence of a similar entropy inequality for discrete random variabl...

Journal:
  • IEEE Trans. Information Theory

Volume: 61, Issue: –
Pages: –
Publication date: 2015